Conversation

@processtrader (Contributor)

Summary

  • Add a dedicated system prompt for GLM models (glm.txt).
  • Wire GLM models to use this new prompt instead of the generic Qwen/fallback prompt.

Background

Previously, SystemPrompt.provider did not have any GLM-specific handling. Any model whose ID did not match the GPT, Gemini, Claude, or Polaris branches fell through to the default:

    return [PROMPT_ANTHROPIC_WITHOUT_TODO] // imported from "./prompt/qwen.txt"

As a result, GLM sessions were using the qwen.txt system prompt rather than a GLM-optimized one.
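
For orientation, a simplified sketch of what the pre-change routing looked like (the function shape and the non-GLM prompt constant names are assumptions for illustration, not copied verbatim from system.ts):

    // packages/opencode/src/session/system.ts (simplified, before this change)
    import PROMPT_BEAST from "./prompt/beast.txt"
    import PROMPT_GEMINI from "./prompt/gemini.txt"
    import PROMPT_ANTHROPIC from "./prompt/anthropic.txt"
    import PROMPT_ANTHROPIC_WITHOUT_TODO from "./prompt/qwen.txt"

    export namespace SystemPrompt {
      export function provider(modelID: string) {
        const id = modelID.toLowerCase()
        if (id.includes("gpt-") || id.includes("o1") || id.includes("o3")) return [PROMPT_BEAST]
        if (id.includes("gemini")) return [PROMPT_GEMINI]
        if (id.includes("claude")) return [PROMPT_ANTHROPIC]
        // ...Polaris and other branches...
        return [PROMPT_ANTHROPIC_WITHOUT_TODO] // default: qwen.txt, which is what GLM models were getting
      }
    }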
Changes

  1. New GLM prompt
    • Added packages/opencode/src/session/prompt/glm.txt.
    • This prompt:
      • Positions opencode as an interactive CLI assistant.
      • Emphasizes GLM-4.6’s reasoning and analytical strengths.
      • Reinforces security constraints (no malicious code).
      • Encourages structured workflows using TodoWrite, WebFetch, and the existing toolchain.
      • Defines a concise communication style and guidance for help/feedback.
  2. System prompt routing for GLM
    • Updated packages/opencode/src/session/system.ts:
      • Imported the new prompt:
        import PROMPT_GLM from "./prompt/glm.txt"
      • Adjusted SystemPrompt.provider to detect GLM models and return the GLM prompt (see the sketch after this list).
    • Matching is done on "glm" (case-insensitive) rather than on a specific version, so all GLM variants benefit from the new prompt.
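
A minimal sketch of how the GLM branch might slot into SystemPrompt.provider (the surrounding function shape is an assumption; only the glm.txt import and the case-insensitive "glm" match are taken from the change description):

    import PROMPT_GLM from "./prompt/glm.txt"

    // inside SystemPrompt.provider, before the qwen.txt fallback:
    // any model ID containing "glm" (e.g. glm-4.6) gets the dedicated prompt
    if (modelID.toLowerCase().includes("glm")) return [PROMPT_GLM]

Substring matching keeps the check version-agnostic, so glm-4.6 and any future GLM releases route to the same prompt without further changes.
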
Behavior / Impact

  • GLM-based sessions now receive GLM-specific system instructions tailored to:
    • Software engineering workflows in the opencode environment.
    • Stronger task planning, research, and tool usage guidance.
  • Other providers and models retain their existing prompts:
    • GPT-* and o1/o3 → beast.txt / codex.txt as before.
    • Gemini → gemini.txt
    • Claude → anthropic.txt
    • Polaris → polaris.txt
    • All other non-GLM, non-GPT/Gemini/Claude/Polaris models → qwen.txt (unchanged).

Testing

  • bun turbo typecheck (runs typecheck across all packages) – pass.
  • Manual verification recommended:
    • Start a session with a GLM model (model ID contains "glm").
      • Confirm that behavior aligns with the new GLM system prompt (e.g., references to TodoWrite, WebFetch, and GLM’s reasoning capabilities).
    • Start a session with non-GLM models and confirm their prompts are unchanged.
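
If a scripted check is preferred over manual sessions, a small routing test along these lines could cover both cases (the SystemPrompt.provider signature and the text-file imports here are assumptions based on the description above, not existing test code):

    // hypothetical packages/opencode/test/system-prompt.test.ts
    import { describe, expect, test } from "bun:test"
    import { SystemPrompt } from "../src/session/system"
    import PROMPT_GLM from "../src/session/prompt/glm.txt"

    describe("SystemPrompt.provider", () => {
      test("routes GLM model IDs to the GLM prompt", () => {
        // case-insensitive substring match on "glm"
        expect(SystemPrompt.provider("glm-4.6")).toEqual([PROMPT_GLM])
        expect(SystemPrompt.provider("GLM-4-Flash")).toEqual([PROMPT_GLM])
      })

      test("leaves non-GLM models on their existing prompts", () => {
        expect(SystemPrompt.provider("qwen2.5-coder")).not.toEqual([PROMPT_GLM])
      })
    })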

@processtrader (Contributor, Author)

Closed, as this will be covered on #4710

processtrader deleted the add-glm-system-prompt branch on November 24, 2025.